Regularizing with Bregman–Moreau Envelopes

Authors
Abstract


Similar resources

From Eckart and Young approximation to Moreau envelopes and vice versa

In matricial analysis, the theorem of Eckart & Young provides a best approximation of an arbitrary matrix by a matrix of rank at most r. In variational analysis or optimization, the Moreau envelopes are appropriate ways of approximating or regularizing the rank function. We prove here that we can go forwards and backwards between the two procedures, thereby showing that they carry essentially t...

Full text
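As a rough, self-contained illustration of the two procedures compared above (our own NumPy sketch; the function names and the example are not from the paper): the Eckart–Young step is a truncated SVD, while the Moreau envelope of the rank function separates over singular values and reduces to hard thresholding at sqrt(2λ). Both act on the same SVD, which hints at how one might pass back and forth between them, as the abstract claims.

```python
import numpy as np

def eckart_young(X, r):
    """Best rank-r approximation of X in Frobenius norm (Eckart-Young):
    keep the r largest singular values, zero out the rest."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[r:] = 0.0
    return U @ np.diag(s) @ Vt

def moreau_envelope_rank(X, lam):
    """Moreau envelope e_lam(X) = min_Y rank(Y) + ||X - Y||_F^2 / (2*lam)
    together with its proximal point. The problem separates over singular
    values: sigma_i is kept (unit rank cost) iff sigma_i^2/(2*lam) > 1,
    i.e. iff sigma_i > sqrt(2*lam) -- singular-value hard thresholding."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    keep = s > np.sqrt(2.0 * lam)
    env = np.sum(np.minimum(1.0, s**2 / (2.0 * lam)))
    prox = U @ np.diag(np.where(keep, s, 0.0)) @ Vt
    return env, prox

X = np.random.default_rng(0).standard_normal((5, 4))
env, prox = moreau_envelope_rank(X, lam=0.5)
print(env, np.linalg.matrix_rank(prox))
```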

Submodular-Bregman and the Lovász-Bregman Divergences with Applications

We introduce a class of discrete divergences on sets (equivalently binary vectors) that we call the submodular-Bregman divergences. We consider two kinds, defined either from tight modular upper or tight modular lower bounds of a submodular function. We show that the properties of these divergences are analogous to the (standard continuous) Bregman divergence. We demonstrate how they generalize...

Full text
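For reference, the "(standard continuous) Bregman divergence" the abstract compares against is D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y> for a strictly convex, differentiable generator phi. A minimal sketch (helper names are ours):

```python
import numpy as np

def bregman(phi, grad_phi):
    """Build D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
    from a strictly convex generator phi and its gradient."""
    def D(x, y):
        return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)
    return D

# phi = (1/2)||x||^2 recovers half the squared Euclidean distance.
sq = bregman(lambda x: 0.5 * np.dot(x, x), lambda x: x)

# phi = sum x_i log x_i (negative entropy) recovers the
# generalized Kullback-Leibler divergence.
kl = bregman(lambda x: np.sum(x * np.log(x)),
             lambda x: np.log(x) + 1.0)

x, y = np.array([0.2, 0.8]), np.array([0.5, 0.5])
print(sq(x, y), kl(x, y))
```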

Clustering with Bregman Divergences

A wide variety of distortion functions, such as squared Euclidean distance, Mahalanobis distance, Itakura-Saito distance and relative entropy, have been used for clustering. In this paper, we propose and analyze parametric hard and soft clustering algorithms based on a large class of distortion functions known as Bregman divergences. The proposed algorithms unify centroid-based parametric clust...

Full text
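A minimal sketch of the k-means-style hard clustering loop that such algorithms share (our simplification, not the paper's pseudocode): the assignment step measures the Bregman divergence from each point to each centroid, and the update step can always use the arithmetic mean, since the mean minimizes the average Bregman divergence over a cluster for any generator phi.

```python
import numpy as np

def bregman_hard_clustering(X, k, D, iters=50, seed=0):
    """k-means-style Bregman hard clustering (in the spirit of Banerjee
    et al.): assign each point to the centroid with the smallest Bregman
    divergence, then move each centroid to its cluster mean, which
    minimizes the average Bregman divergence for any generator phi."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assignment step: divergence measured FROM the point TO the centroid.
        labels = np.array([np.argmin([D(x, c) for c in centroids]) for x in X])
        # Update step: arithmetic mean per non-empty cluster.
        centroids = np.array([X[labels == j].mean(axis=0)
                              if np.any(labels == j) else centroids[j]
                              for j in range(k)])
    return labels, centroids

# With squared Euclidean distance this reduces to plain k-means.
D = lambda x, y: 0.5 * np.sum((x - y) ** 2)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.1, (20, 2)) for m in (0.0, 3.0)])
labels, centroids = bregman_hard_clustering(X, k=2, D=D)
print(centroids)
```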

Regularizing AdaBoost

AdaBoost is an iterative algorithm to construct classifier ensembles. It quickly achieves high accuracy by focusing on objects that are difficult to classify. Because of this, AdaBoost tends to overfit when subjected to noisy datasets. We observe that this can be partially prevented with the use of validation sets, taken from the same noisy training set. But using less than th...

Full text
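A minimal sketch of the validation idea under the simplest possible reading (plain AdaBoost over decision stumps, with the stopping round chosen on a held-out split carved from the same noisy training data; this is our illustration, not the paper's procedure):

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    """+/-1 prediction of a one-feature threshold stump."""
    return sign * np.where(X[:, feat] > thresh, 1.0, -1.0)

def fit_stump(X, y, w):
    """Brute-force decision stump minimizing the weighted 0-1 error."""
    best = (np.inf, 0, 0.0, 1.0)
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1.0, -1.0):
                err = np.sum(w * (stump_predict(X, feat, thresh, sign) != y))
                if err < best[0]:
                    best = (err, feat, thresh, sign)
    return best

def adaboost_early_stop(X, y, Xv, yv, rounds=100):
    """Plain AdaBoost on (X, y); the validation split (Xv, yv) picks the
    number of rounds to keep -- a simple guard against overfitting."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble, Fv = [], np.zeros(len(yv))
    best_err, best_T = np.inf, 0
    for _ in range(rounds):
        err, feat, thresh, sign = fit_stump(X, y, w)
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        h = stump_predict(X, feat, thresh, sign)
        w *= np.exp(-alpha * y * h)   # reweight: emphasize mistakes
        w /= w.sum()
        ensemble.append((alpha, feat, thresh, sign))
        Fv += alpha * stump_predict(Xv, feat, thresh, sign)
        val_err = np.mean(np.sign(Fv) != yv)
        if val_err < best_err:
            best_err, best_T = val_err, len(ensemble)
    return ensemble[:best_T]
```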

Regularizing AdaBoost

Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting for low noise cases. Also for noisy data boosting will try to enforce a hard margin and thereby give too much weight to outliers, which then leads to the dilemma of non-smooth fits and overfitting. Therefore we propose three algorithms to allow for soft margin classification by ...

Full text
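For context, the generic soft-margin relaxation behind approaches of this kind (the standard formulation in our notation; the paper's three specific algorithms are not reproduced here) replaces the hard margin constraint $y_i f(x_i) \ge \rho$ with slack variables:

$$\max_{\rho,\,f,\,\xi}\ \rho - C\sum_{i=1}^{n}\xi_i \quad\text{subject to}\quad y_i\,f(x_i) \ge \rho - \xi_i,\qquad \xi_i \ge 0,$$

where $f$ is the combined classifier output. As $C \to \infty$ all slacks vanish and the hard margin is recovered; a finite $C$ charges outliers a linear price for violating the margin instead of forcing the fit through them.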


Journal

Journal title: SIAM Journal on Optimization

Year: 2018

ISSN: 1052-6234,1095-7189

DOI: 10.1137/17m1130745